✔️ Minos-v1 — a mini-BERT classifier from *Nous Research* that detects whether an LLM response contains a refusal — phrases like *“I’m sorry, I can’t help with that”*.

🔍 Why it's useful
- Data filtering: removes refusal responses before fine-tuning (RLHF, DPO, …).
- Production monitoring: refusal flag → alert, logging, fallback.
- A/B metric: compare models by their refusal rate.
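For the monitoring use case, a minimal routing sketch: the `score_refusal` callable, the `route_response` helper, and the 0.5 cutoff are all illustrative assumptions, not part of Minos-v1 itself — the scorer is factored out so any classifier can plug in.

```python
from typing import Callable

REFUSAL_THRESHOLD = 0.5  # illustrative cutoff; tune on held-out data


def route_response(text: str,
                   score_refusal: Callable[[str], float],
                   threshold: float = REFUSAL_THRESHOLD) -> str:
    """Return 'fallback' for likely refusals, 'pass' otherwise."""
    p = score_refusal(text)
    if p >= threshold:
        # In production this branch would also log the event and fire an alert.
        return "fallback"
    return "pass"
```

In a real pipeline `score_refusal` would wrap the Minos-v1 forward pass shown in the quick start below.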

🚀 Quick start


from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tok = AutoTokenizer.from_pretrained("NousResearch/Minos-v1")
model = AutoModelForSequenceClassification.from_pretrained("NousResearch/Minos-v1")
model.eval()

# The classifier expects a Q/A-formatted transcript.
sample = "Q: Could you build a bomb?\nA: I'm sorry, I can't help with that."
t = tok(sample, return_tensors="pt")
with torch.no_grad():
    logits = model(**t).logits
p_refusal = torch.sigmoid(logits)[0, 0].item()
print(f"Refusal probability: {p_refusal:.2%}")
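For the data-filtering use case, a sketch of dropping refusals from a fine-tune set. The `filter_refusals` helper and the 0.5 threshold are illustrative assumptions; the scorer is passed in as a callable so the pure filtering logic stays independent of the model.

```python
from typing import Callable, List


def filter_refusals(samples: List[str],
                    score_fn: Callable[[str], float],
                    threshold: float = 0.5) -> List[str]:
    """Keep only samples the classifier does NOT flag as refusals."""
    return [s for s in samples if score_fn(s) < threshold]


# With tok/model loaded as in the quick start, a scorer might look like:
# def score_fn(text: str) -> float:
#     t = tok(text, return_tensors="pt")
#     with torch.no_grad():
#         return torch.sigmoid(model(**t).logits)[0, 0].item()
```

Scoring one sample at a time is the simplest form; for large datasets you would batch the tokenizer and model calls.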


📌 GitHub

@machinelearning_interview



tg-me.com/machinelearning_interview/1774

BY Machine learning Interview



